An Effective Online Sequential Stochastic Configuration Algorithm for Neural Networks


Abstract

Random vector functional-link (RVFL) networks, as a class of randomized learner models, have received considerable attention from the neural network research community because they admit fast learning algorithms: the hidden-layer parameters are randomly generated and remain fixed during the training phase. However, the universal approximation ability of an RVFL network may not be guaranteed unless the random parameters are drawn from an appropriately selected range. Moreover, the resulting learner's generalization performance can seriously deteriorate unless the RVFL network's structure is well designed. The stochastic configuration (SC) algorithm, which incrementally constructs an approximator under a specified supervisory mechanism instead of fixing the selection scope in advance without any reference information, can effectively circumvent these issues caused by randomness. This paper extends the SC algorithm to an online sequential version, termed OSSC, by means of the recursive least squares (RLS) technique, aiming to cope with modeling tasks where observations are provided sequentially. Compared with online sequential RVFL networks (OS-RVFL for short), our proposed OSSC algorithm avoids setting an unreasonable fixed range for the random parameters, and can also successfully build randomized learners with preferable learning and generalization capabilities. The experimental study has shown the effectiveness of the proposed algorithm.
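The online sequential part of the approach described above rests on the standard recursive least squares recursion for the output weights of a network whose hidden-layer parameters are fixed. The following minimal sketch illustrates that recursion for a generic random-feature network; the sigmoid feature map, the dimensions, and all variable names are illustrative assumptions, not the paper's actual implementation (which additionally grows the hidden layer under the SC supervisory mechanism).

```python
import numpy as np

rng = np.random.default_rng(0)

d, L = 3, 20                      # input dimension, number of hidden nodes (assumed)
W = rng.uniform(-1, 1, (L, d))    # random input weights, kept fixed after generation
b = rng.uniform(-1, 1, L)         # random biases, kept fixed after generation

def hidden(x):
    """Sigmoid random-feature map h(x); the activation choice is an assumption."""
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

# RLS state: output weights beta and inverse-covariance matrix P.
beta = np.zeros(L)
P = np.eye(L) * 1e3               # large initial P acts as a weak prior

def rls_update(x, y):
    """Incorporate one observation (x, y) via the RLS recursion."""
    global beta, P
    h = hidden(x)
    Ph = P @ h
    k = Ph / (1.0 + h @ Ph)       # gain vector
    beta = beta + k * (y - h @ beta)
    P = P - np.outer(k, Ph)       # rank-one update of the inverse covariance

# Stream observations of a simple synthetic target and fit sequentially.
for _ in range(2000):
    x = rng.uniform(-1, 1, d)
    y = np.sin(x).sum()
    rls_update(x, y)

x_test = np.array([0.2, -0.1, 0.4])
print(hidden(x_test) @ beta, np.sin(x_test).sum())
```

Because each update is a rank-one correction of `P`, processing one observation costs O(L^2) rather than re-solving the full least-squares problem, which is what makes the sequential setting tractable.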


Similar articles

Stochastic Sequential Neural Networks with Structured Inference

Unsupervised structure learning in high-dimensional time series data has attracted a lot of research interest. For example, segmenting and labelling high-dimensional time series can be helpful in behavior understanding and medical diagnosis. Recent advances in generative sequential modeling have suggested combining recurrent neural networks with state space models (e.g., Hidden Markov Models)...


An effective algorithm for hyperparameter optimization of neural networks

A major challenge in designing neural network (NN) systems is to determine the best structure and parameters for the network given the data for the machine learning problem at hand. Examples of parameters are the number of layers and nodes, the learning rates, and the dropout rates. Typically, these parameters are chosen based on heuristic rules and manually fine-tuned, which may be very time-c...


An effective configuration learning algorithm for entity resolution

Entity resolution is the problem of finding co-referent instances, i.e., instances that describe the same real-world entity. It is an important component of data integration systems and is indispensable in the linked data publication process. Entity resolution has been a subject of extensive research; however, the search for a perfect resolution algorithm remains a work in progress. Many approaches have been pr...


Batch-Sequential Algorithm for Neural Networks Trained with Entropic Criteria

The use of entropy as a cost function in the neural network learning phase usually implies that, in the back-propagation algorithm, the training is done in batch mode. Apart from the higher complexity of the algorithm in batch mode, we know that this approach has some limitations over the sequential mode. In this paper we present a way of combining both modes when using entropic criteria. We pr...


A New Stochastic Learning Algorithm for Neural Networks

A new stochastic learning algorithm using Gaussian white noise sequence, referred to as Subconscious Noise Reaction (SNR), is proposed for a class of discrete-time neural networks with time-dependent connection weights. Unlike the back-propagation-through-time (BTT) algorithm, SNR does not require the synchronous transmission of information backward along connection weights, while it uses only ...



Journal

Journal title: Sustainability

Year: 2022

ISSN: 2071-1050

DOI: https://doi.org/10.3390/su142315601